Method of moments (probability theory)
In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of a sequence of moment sequences. Suppose ''X'' is a random variable and that all of the moments

\[ E(X^k), \qquad k = 1, 2, \ldots \]

exist. Further suppose the probability distribution of ''X'' is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). Let ''X_1'', ''X_2'', ... be a sequence of random variables all of whose moments exist. If

\[ \lim_{n \to \infty} E(X_n^k) = E(X^k) \]

for all values of ''k'', then the sequence ''X_n'' converges to ''X'' in distribution.

The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé. More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and it has since found numerous applications in the theory of random matrices.
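As a brief illustration (a sketch not spelled out above, and assuming the summands have finite moments of all orders, a stronger hypothesis than the finite variance needed for the classical central limit theorem), Chebyshev's application runs as follows. The standard normal variable ''Z'' is determined by its moments (its moment sequence satisfies Carleman's condition), and those moments are

\[ E(Z^k) = \begin{cases} 0, & k \text{ odd}, \\[4pt] (k-1)!! = \dfrac{k!}{2^{k/2}\,(k/2)!}, & k \text{ even}. \end{cases} \]

For i.i.d. ''X_1'', ''X_2'', ... with mean \( \mu \) and variance \( \sigma^2 \), set

\[ Z_n = \frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}} . \]

A combinatorial expansion of \( E(Z_n^k) \) shows that \( \lim_{n \to \infty} E(Z_n^k) = E(Z^k) \) for every ''k'', so by the method of moments ''Z_n'' converges to ''Z'' in distribution, which is the central limit theorem under these moment assumptions.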